Armijo Newton method for convex best interpolation

Authors

  • Hou-Duo Qi
  • Xiaoqi Yang
Abstract

More than a decade ago, Newton's method was proposed for constructing the convex best interpolant. Its local quadratic convergence was established only recently, by recasting it as the generalized Newton method for semismooth equations. It remains mysterious why the Newton method coupled with line-search strategies works well in practice in a global sense: as with the classical Newton method, the Newton matrix may be singular or near singular far from the solution, which poses considerable difficulty in proving global convergence of the Newton method with line search. By employing the objective function of the Lagrange dual problem, it is observed that whenever the Newton matrix is near singular at some point, one can easily find a nearby point that has a well-conditioned Newton matrix and a lower function value. Based on this fact, Newton's method with Armijo line search is shown to be globally convergent as well as locally quadratically convergent. In an important case, it also has the finite termination property. Numerical results demonstrate the efficiency of the proposed method.
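The core loop the abstract describes can be illustrated with a generic sketch of Newton's method with Armijo backtracking. This is an assumption-laden illustration, not the paper's dual-based algorithm: where the paper perturbs a near-singular iterate to a nearby well-conditioned point with a lower dual value, the sketch below falls back to a Tikhonov-regularized Newton system to restore a descent direction.

```python
import numpy as np

def armijo_newton(f, grad, hess, x0, sigma=1e-4, beta=0.5,
                  tol=1e-10, reg=1e-8, max_iter=100):
    """Newton's method with Armijo backtracking (illustrative sketch).

    Near-singular Hessians are handled by regularization, a simple
    stand-in for the paper's move to a nearby well-conditioned point.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H = hess(x)
        try:
            d = np.linalg.solve(H, -g)
            if g @ d >= 0:            # not a descent direction
                raise np.linalg.LinAlgError("ascent direction")
        except np.linalg.LinAlgError:
            # (near-)singular Newton matrix: solve a regularized system
            d = np.linalg.solve(H + reg * np.eye(len(x)), -g)
        t = 1.0
        # Armijo condition: sufficient decrease along d
        while f(x + t * d) > f(x) + sigma * t * (g @ d):
            t *= beta
        x = x + t * d
    return x
```

On a strongly convex quadratic the unit step satisfies the Armijo test immediately, so the iteration terminates in one full Newton step, which mirrors the finite-termination behavior the abstract mentions for an important special case.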


Similar articles

Quadratic Convergence of Newton’s Method for Convex Interpolation and Smoothing

In this paper, we prove that Newton's method for convex best interpolation is locally q-quadratically convergent, giving an answer to a question of Irvine, Marin and Smith [7] and strengthening a result of Andersson and Elfving [1] and our previous work [5]. A damped Newton-type method is presented which has global q-quadratic convergence. Analogous results are obtained for the convex smoothin...


Block BFGS Methods

We introduce a quasi-Newton method with block updates called Block BFGS. We show that this method, performed with inexact Armijo-Wolfe line searches, converges globally and superlinearly under the same convexity assumptions as BFGS. We also show that Block BFGS is globally convergent to a stationary point when applied to non-convex functions with bounded Hessian, and discuss other modifications...


A Finite Newton Method for Classification Problems

A fundamental classification problem of data mining and machine learning is that of minimizing a strongly convex, piecewise quadratic function on the n-dimensional real space Rn. We show finite termination of a Newton method to the unique global solution starting from any point in Rn. If the function is well conditioned, then no stepsize is required from the start, and if not, an Armijo stepsiz...


A quadratically convergent Newton method for vector optimization

We propose a Newton method for solving smooth unconstrained vector optimization problems under partial orders induced by general closed convex pointed cones. The method extends the one proposed by Fliege, Graña Drummond and Svaiter for multicriteria, which in turn is an extension of the classical Newton method for scalar optimization. The steplength is chosen by means of an Armijo-like rule, gu...



Journal:
  • Optimization Methods and Software

Volume 21, Issue

Pages  -

Publication date: 2006